Homework 2 - Foundations of Algorithms for Massive Datasets (236779), Fall 2015

Author

  • Nir Ailon
Abstract

Small coherence (that is, incoherence) has nice properties, as we shall see now. Assume A has coherence μ < 1. Prove that for all s < 1/μ, A satisfies RIP with parameters s, δ = sμ. Hint: for an s-sparse x, expand ‖Ax‖² = (Ax)ᵀ(Ax) = xᵀAᵀAx using inner products of columns of A. Note: there exist deterministic constructions of k×d matrices with incoherence ∼ 1/√k, which by the above imply RIP with parameters s = √k, δ = O(1). But using random constructions (e.g., JL, as you did above) we know how to obtain matrices of the same size with RIP of much better parameters s = k/log d, δ = O(1). This shows a gap between what we can achieve using deterministic versus randomized constructions. For certain applications, incoherence might be enough.
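As an illustrative aside (not part of the assignment), the following Python sketch spot-checks the claim above: it draws a random matrix with unit-norm columns, computes its coherence μ, picks a sparsity s < 1/μ, and verifies on random s-sparse vectors that ‖Ax‖² stays within 1 ± sμ times ‖x‖². The dimensions and the number of trials are arbitrary choices, not values from the homework.

import numpy as np

# Minimal numerical check of the exercise's claim (a spot check, not a proof):
# if A has unit-norm columns and coherence mu, then for s < 1/mu and any
# s-sparse x we expect (1 - s*mu)*||x||^2 <= ||Ax||^2 <= (1 + s*mu)*||x||^2.
rng = np.random.default_rng(0)
k, d = 512, 1024                          # arbitrary dimensions
A = rng.standard_normal((k, d))
A /= np.linalg.norm(A, axis=0)            # normalize columns to unit norm

G = A.T @ A
mu = np.abs(G - np.eye(d)).max()          # coherence: max off-diagonal |<A_i, A_j>|
s = max(1, int(np.ceil(1.0 / mu)) - 1)    # some sparsity level with s < 1/mu
delta = s * mu
print(f"mu = {mu:.3f}, s = {s}, claimed RIP constant delta = s*mu = {delta:.3f}")

worst = 0.0
for _ in range(2000):                     # random s-sparse test vectors
    supp = rng.choice(d, size=s, replace=False)
    x = np.zeros(d)
    x[supp] = rng.standard_normal(s)
    ratio = (np.linalg.norm(A @ x) / np.linalg.norm(x)) ** 2
    worst = max(worst, abs(ratio - 1.0))

print(f"max observed | ||Ax||^2 / ||x||^2 - 1 | = {worst:.3f}  (bound: {delta:.3f})")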


Similar resources

236779: Foundations of Algorithms for Massive Datasets, Lecture 4: The Johnson-Lindenstrauss Lemma

The Johnson-Lindenstrauss lemma and its proof. This lecture aims to prove the Johnson-Lindenstrauss lemma. Since the lemma is proved easily with the help of another interesting lemma, part of this lecture is devoted to the proof of that second lemma. At the end, the optimality of the Johnson-Lindenstrauss lemma is discussed. Lemma 1 (Johnson-Lindenstrauss). Given the initial space X ⊆ R^n s.t. |X| = N, ...
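For concreteness, here is a small illustration of the Gaussian random-projection form of the JL lemma (the lecture's exact statement and constants are truncated above, so the target dimension k and the constant 8 below are just common, non-tight choices): project N points from R^n to R^k and check that all pairwise distances are preserved up to 1 ± ε.

import numpy as np

# Hedged JL illustration: Gaussian random projection of N points from R^n to R^k.
rng = np.random.default_rng(0)
n, N, eps = 1000, 50, 0.25
k = int(np.ceil(8 * np.log(N) / eps**2))   # 8 is an assumed, non-tight constant

X = rng.standard_normal((N, n))            # N arbitrary points in R^n
P = rng.standard_normal((k, n)) / np.sqrt(k)
Y = X @ P.T                                # projected points in R^k

ratios = []
for i in range(N):
    for j in range(i + 1, N):
        ratios.append(np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j]))

print(f"k = {k}, distance ratios in [{min(ratios):.3f}, {max(ratios):.3f}] "
      f"(target: [{1 - eps:.2f}, {1 + eps:.2f}])")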


Homework 1 - 236779 Nir

C_{i, h_i(a_t)} += s_i(a_t) • Query: Given a ∈ Σ, return f̂_a := median_{i=1..u} { C_{i, h_i(a)} · s_i(a) }. As usual, let f_a = |{i : a_i = a}| (the frequency of symbol a). For simplicity, you can assume that the stream is determined and fixed ahead of time; you then initialize your algorithm (by randomly choosing the hash functions), and only then you view the stream. Let TOP_k ⊆ Σ denote the set of top-k heavy hitters i...
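A minimal runnable sketch of the update/query rules quoted above (an illustration, not the homework's solution): the hash functions h_i and signs s_i below are lazily filled random dictionaries rather than the pairwise-independent families the analysis actually needs, and the row/width parameters are arbitrary.

import random
from collections import Counter

u, w = 5, 64                          # rows and counters per row (arbitrary)
random.seed(0)

C = [[0] * w for _ in range(u)]       # counter table
h = [dict() for _ in range(u)]        # h_i : Sigma -> {0, ..., w-1}
s = [dict() for _ in range(u)]        # s_i : Sigma -> {-1, +1}

def hs(i, a):
    # Lazily assign h_i(a) and s_i(a) the first time symbol a is seen in row i.
    if a not in h[i]:
        h[i][a] = random.randrange(w)
        s[i][a] = random.choice((-1, 1))
    return h[i][a], s[i][a]

def update(a):
    # On seeing symbol a_t in the stream: C[i][h_i(a_t)] += s_i(a_t) for every row i.
    for i in range(u):
        j, sign = hs(i, a)
        C[i][j] += sign

def query(a):
    # f_hat_a := median over rows of C[i][h_i(a)] * s_i(a).
    ests = []
    for i in range(u):
        j, sign = hs(i, a)
        ests.append(C[i][j] * sign)
    ests.sort()
    return ests[u // 2]               # u is odd, so this is the median

stream = list("abracadabra" * 40)
for ch in stream:
    update(ch)
true = Counter(stream)
for ch in "abrcd":
    print(ch, "true:", true[ch], "estimate:", query(ch))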


236779: Foundations of Algorithms for Massive Datasets, Nov 11 2015 Lecture

These notes cover the end of the Frequent-Items (Batch-Decrement) sketch, the Count-Min sketch, the F2 Tug-of-War (AMS) sketch, and initial background for dimensionality reduction and the Johnson-Lindenstrauss transform. 1 Reminder: Frequency Moments. We are given a stream (sequence) of N characters (or items) a_1, a_2, ..., a_N from a large alphabet Σ of size |Σ| = n. Definition 1. A histogram ...
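Since Definition 1 is truncated above, here is a tiny snippet using the standard notions (my assumption about what the definition contains): the histogram of a stream a_1, ..., a_N is f_a = |{t : a_t = a}|, and the p-th frequency moment is F_p = Σ_a f_a^p; F_2 is what the Tug-of-War (AMS) sketch estimates.

from collections import Counter

stream = list("abracadabra")              # toy stream over a small alphabet
hist = Counter(stream)                    # histogram: f_a for each symbol a
F1 = sum(hist.values())                   # first moment = N, the stream length
F2 = sum(f * f for f in hist.values())    # second moment, estimated by AMS Tug-of-War

print(dict(hist), "F1 =", F1, "F2 =", F2)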


Lecture Notes: Distributed Algorithms

This booklet includes lecture notes from the parallel and distributed computing course I taught in Fall 07, Spring 08, and Fall 08 at Kent State University, USA. I wrote a good part of these notes in Fall 2007; I revise them every time I teach the course. The problems in the notes have been used in class lectures, homework assignments, or exams. You can find a collection of homework assignments a...


A comparison of public datasets for acceleration-based fall detection.

Falls are one of the leading causes of mortality among the older population, and rapid detection of a fall is a key factor in mitigating its main adverse health consequences. In this context, several authors have conducted studies on acceleration-based fall detection using external accelerometers or smartphones. The published detection rates are diverse, sometimes close to a perfect detector. ...




Publication date: 2016